Mathematical Methods in Reliability
Authors
Abstract
Natural constructions of hazard and intensity functions
Prof. Dave Percy (Salford)
We investigate the reliability distributions for variants of standard hazard functions that model periodicity, impulses, amplifications, damping, orders and extrema. We also consider parallel-series system configurations and the resulting implications for intensity functions of complex systems, concluding with a discussion of prior elicitation and practical applications.

Monitoring using Event History Data
Dr Axel Gandy (Imperial, London)
This talk discusses how survival analysis and event history models can be used for monitoring purposes. In particular, CUSUM charts based on partial likelihood ratios will be discussed. As with most control charts, this method needs an alarm threshold. Calibration of the threshold to achieve a desired in-control property (e.g. average run length or false alarm probability) often ignores the fact that the in-control distribution is usually only estimated. A method to take account of this estimation error when calibrating charts will be suggested.

Warranty data analysis: new results and challenges
Dr Shaomin Wu (Cranfield)
Warranty claims and supplementary data contain useful information about product quality and reliability. Analysing such data can therefore benefit manufacturers in identifying early warnings of abnormalities in their products, providing useful information about failure modes to aid design modification, estimating product reliability for deciding on warranty policy, and forecasting future warranty claims needed for preparing fiscal plans. In the last two decades, considerable research has been conducted in warranty data analysis (WDA) from several different perspectives. This presentation reports our newly developed approaches to warranty forecasting and some existing challenges.
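As a rough illustration of the CUSUM idea in Gandy's abstract above, the following sketch runs a chart on log-likelihood-ratio increments. It is a minimal sketch only: the exponential lifetimes, the rate shift at observation 50 and the threshold value are illustrative assumptions, not details from the talk, and the threshold-calibration method the talk proposes is not reproduced here.

```python
import numpy as np

def cusum_alarm(increments, threshold):
    """Run a CUSUM chart on log-likelihood-ratio increments.

    The statistic is reflected at zero; returns the index of the
    first observation at which it reaches the threshold, or None.
    """
    s = 0.0
    for t, inc in enumerate(increments):
        s = max(0.0, s + inc)
        if s >= threshold:
            return t
    return None

rng = np.random.default_rng(0)
# Illustrative data: exponential(rate 1) lifetimes, shifting to rate 2 after 50 observations.
lifetimes = np.concatenate([rng.exponential(1.0, size=50),
                            rng.exponential(0.5, size=50)])
# Log-likelihood-ratio increment of a rate-2 vs rate-1 exponential for observation x: log 2 - x.
increments = np.log(2.0) - lifetimes
print("first alarm at observation:", cusum_alarm(increments, threshold=3.0))
```

In control, the increments have negative mean (log 2 - 1), so the reflected statistic hovers near zero; after the shift the mean becomes positive and the statistic drifts towards the threshold.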
Multivariate quantile-quantile plots and related tests
Dr Subhra Sankar Dhar (Cambridge)
The univariate quantile-quantile (Q-Q) plot is a well-known graphical tool for examining whether two data sets are generated from the same distribution. It is also used to determine how well a specified probability distribution fits a given sample. In this talk, we develop and study a multivariate version of the Q-Q plot based on spatial quantiles (see Chaudhuri (1996), JASA). The usefulness of the proposed graphical device will be illustrated on different real and simulated data sets, some of which have fairly large dimensions. We will also develop statistical tests related to the proposed multivariate Q-Q plots and study their asymptotic properties. The performance of those tests compared with some other well-known tests for multivariate distributions will also be discussed. This is joint work with Biman Chakraborty and Probal Chaudhuri.

Joint modelling of longitudinal and survival data
Dr Yanchun Bao (Brunel)
In survival analysis, time-dependent covariates are usually present as longitudinal data collected periodically and measured with error. The longitudinal data can be assumed to follow a linear mixed effects model, and Cox regression models may be used for modelling survival events. The hazard rate of survival times depends on the underlying time-dependent covariate measured with error, which may be described by random effects. Most existing methods proposed for such models place a parametric distribution assumption on the random effects and specify a normally distributed error term for the linear mixed effects model. These assumptions may not always be valid in practice. In this paper we propose a new likelihood method for Cox regression models with error-contaminated time-dependent covariates. The proposed method does not require any parametric distribution assumption on the random effects and random errors. Asymptotic properties for the parameter estimators are provided. Simulation results show that under certain situations the proposed methods are more efficient than existing methods.

Nonparametric predictive inference for reliability of coherent systems
Prof. Frank Coolen (Durham)
Nonparametric predictive inference (NPI) is a statistical method using relatively few modelling assumptions, enabled through the use of lower and upper probabilities to quantify uncertainty. In this talk, lower and upper survival functions for coherent systems are presented, based on test results in the form of failure times of components exchangeable with those in the system. As it is a data-driven approach, such test failure times must be available for each type of component in the system. It is also shown how partial knowledge of the system structure can be used, which has the advantage of possibly reducing the computational effort in the case of a specific reliability target. (Joint work with Ahmad Aboalkhair, Abdullah Al-nefaieeh and Tahani Coolen-Maturi)

Optimal design for censored lifetime experiments
Dr Alan Kimber (Southampton)
Censoring may occur in many industrial or biomedical 'time to event' experiments. Efficient designs for such experiments are needed, but finding them can be problematic since the statistical models involved will usually be nonlinear, making the optimal choice of design parameter-dependent. We provide analytical characterisations of locally D- and c-optimal designs for a class of models that includes the natural proportional hazards parameterisation of the exponential regression model, thus substantially reducing the numerical effort for design search. Links to designs for the semi-parametric Cox proportional hazards model are also discussed.
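Coolen's abstract above concerns lower and upper survival functions for coherent systems. As background only, the sketch below computes the exact reliability of a small coherent system with independent components by enumerating its structure function; this is the classical precise-probability special case, not the NPI method itself, and the example system and component reliabilities are illustrative assumptions.

```python
import itertools

def system_reliability(structure, p):
    """Exact reliability of a coherent system with independent components.

    structure: maps a tuple of component states (0 = failed, 1 = working)
               to a truthy value iff the system works.
    p:         list of component reliabilities.
    """
    total = 0.0
    for states in itertools.product((0, 1), repeat=len(p)):
        if structure(states):
            prob = 1.0
            for s, pi in zip(states, p):
                prob *= pi if s else (1.0 - pi)
            total += prob
    return total

# Parallel-series example: two parallel pairs in series,
# i.e. the system works iff (component 1 or 2) and (component 3 or 4).
phi = lambda x: (x[0] or x[1]) and (x[2] or x[3])
print(system_reliability(phi, [0.9, 0.9, 0.8, 0.8]))  # (1 - 0.01)(1 - 0.04) = 0.9504
```

Full enumeration costs 2^n evaluations, so it is only practical for small n; the partial structure knowledge mentioned in the abstract is one route to reducing that effort.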
Small sample inference of GPD with application in MTTF and Volatility
Mr Zhuo Sheng (Brunel)
Exact statistical inference for the generalised Pareto distribution (GPD) under extreme value theory is often subject to small sample sizes. Estimation becomes even more difficult for extremely high quantiles of the GPD, yet quantiles are very useful measures in risk analysis, with applications in MTTF (mean time to failure) and volatility.

Survival Models and Threshold Crossings
Prof. Martin Newby (City, London)
Many situations can be described as threshold crossing problems. Problems of wear and degradation, stock control and illness can all be described using threshold crossings. State-dependent maintenance also demands the analysis of this kind of model. The distribution of the time to failure is determined as a first hitting time distribution. I look at a few concrete examples and show some results based on elementary arguments about thinning of stochastic processes. The models derived also provide an alternative to competing risks for multiple causes of failure.

Accelerated failure time models for censored survival data under referral bias
Dr Hongsheng Dai (Brighton)
The estimation of progression to liver cirrhosis and the identification of its risk factors are often of epidemiological interest in hepatitis C natural history studies. In most hepatitis C cohort studies, patients are usually recruited to the cohort with referral bias because, clinically, patients with more rapid disease progression are preferentially referred to liver clinics. A pair of correlated event times may be observed for each patient: time to development of cirrhosis and time to referral to a cohort. This paper considers accelerated failure time models to study the effects of covariates on progression to cirrhosis. A new non-parametric estimator is proposed to handle a flexible bivariate distribution of the cirrhosis and referral times and to take the referral bias into account. The asymptotic normality of the proposed coefficient estimator is also provided. Numerical studies show that the coefficient estimator and its covariance function estimator perform well.

Review and Future Plans
Prof. Frank Coolen, Prof. Dave Percy, Prof. John Quigley, Dr Keming Yu
This meeting is part of a joint research group on Mathematical Methods in Reliability, involving Durham, Strathclyde, Brunel and Salford Universities, and is sponsored by a Scheme 3 grant from the London Mathematical Society. For more information about related events see: http://maths.dur.ac.uk/stats/people/fc/LMS-Reliability.html
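As a small numerical companion to Sheng's GPD abstract above, the sketch below evaluates GPD quantiles in closed form and fits the shape and scale by the classical method-of-moments estimators, which are simple to use on small samples. This is a generic sketch, not the exact small-sample inference method of the talk; the sample size and parameter values are illustrative assumptions.

```python
import numpy as np

def gpd_quantile(p, xi, sigma):
    """Quantile function of GPD(shape xi, scale sigma), for 0 < p < 1."""
    if xi == 0.0:
        return -sigma * np.log1p(-p)                 # exponential limit
    return sigma / xi * ((1.0 - p) ** (-xi) - 1.0)

def gpd_fit_moments(x):
    """Method-of-moments estimates of (xi, sigma); requires xi < 1/2."""
    m, v = np.mean(x), np.var(x, ddof=1)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

rng = np.random.default_rng(2)
# Simulate a small GPD(0.2, 1.0) sample by inverse transform, then
# estimate an extremely high quantile from the fitted parameters.
u = rng.uniform(size=30)
sample = np.array([gpd_quantile(p, 0.2, 1.0) for p in u])
xi_hat, sigma_hat = gpd_fit_moments(sample)
print("estimated 99.9% quantile:", gpd_quantile(0.999, xi_hat, sigma_hat))
```

With only 30 observations the estimated extreme quantile is highly variable from sample to sample, which is precisely the small-sample difficulty the abstract highlights.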
Similar resources
Mathematical modeling and fuzzy availability analysis for serial processes in the crystallization system of a sugar plant
The binary-state (i.e., success or failed) assumptions used in conventional reliability are inappropriate for the reliability analysis of complex industrial systems due to a lack of sufficient probabilistic information. For large complex systems, the uncertainty of each individual parameter enhances the uncertainty of the system reliability. In this paper, the concept of fuzzy reliability...
Optimization Methods for Power Grid Reliability
Techniques for Modeling the Reliability of Fault-Tolerant Systems With the Markov State-Space Approach
This paper presents a step-by-step tutorial of the methods and the tools that were used for the reliability analysis of fault-tolerant systems. The approach of this paper is the Markov (or semi-Markov) state-space method. The paper is intended for design engineers with a basic understanding of computer architecture and fault tolerance, but little knowledge of reliability modeling. The represent...
Testing the untestable: reliability in the 21st century
As science and technology become increasingly sophisticated, government and industry are relying more and more on science’s advanced methods to determine reliability. Unfortunately, political, economic, time, and other constraints imposed by the real world inhibit the ability of researchers to calculate reliability efficiently and accurately. Because of such constraints, reliability must underg...
Reliability for the 21 Century
The sophistication of science and technology is growing almost exponentially. Government and industry are relying more and more on science’s advanced methods to assess reliability coupled with performance, safety, surety, cost, schedule, etc. Unfortunately, policy, cost, schedule, and other constraints imposed by the real world inhibit the ability of researchers to calculate these metrics effic...
Increasing the Reliability and the Profit in a Redundancy Allocation Problem
This paper proposes a new mathematical model for the multi-objective redundancy allocation problem (RAP) without component mixing in each subsystem, where the redundancy strategy can be chosen for individual subsystems. The majority of mathematical models for multi-objective redundancy allocation problems (MORAP) assume that the redundancy strategy for each subsystem is predetermined and fixed...
Journal title:
Volume, issue:
Pages: -
Publication date: 2012